Recurrent Networks: State Machines Or Iterated Function Systems?
Authors
Abstract
In other words, the recurrent network states are not IP states in and of themselves; they require an appropriate context that can elevate them to IP-hood. This context consists of a set of input sequences and an observation method for generating outputs. While the recurrent network’s state dynamics may be described as an IFS, any IP interpretation will involve a holistic combination of the set of possible inputs, the state dynamics, and the output-generation mechanism of the network.
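To make the distinction concrete, here is a minimal sketch (my own illustration, not code from the paper; all names and parameter values are hypothetical). The recurrent state update is written as an IFS in which each input symbol selects one contractive map, and a separate observation function is needed before the states support anything like the IP interpretation the abstract discusses:

import numpy as np

# Sketch: the state dynamics of a simple recurrent network viewed as an
# iterated function system.  Each input symbol selects one small-gain map
# f_s(x) = tanh(W_s x + b_s); driving the network with a symbol sequence
# iterates the corresponding maps.
rng = np.random.default_rng(0)
STATE_DIM = 2

def make_map(scale=0.4):
    # One IFS map: a contractive recurrent update with random weights.
    W = scale * rng.standard_normal((STATE_DIM, STATE_DIM))
    b = rng.standard_normal(STATE_DIM)
    return lambda x: np.tanh(W @ x + b)

maps = {"a": make_map(), "b": make_map()}   # one map per input symbol

def run(sequence, x0=None):
    # Iterate the maps selected by the input sequence (the IFS view).
    x = np.zeros(STATE_DIM) if x0 is None else x0
    for symbol in sequence:
        x = maps[symbol](x)
    return x

def observe(x, threshold=0.0):
    # The observation method: only together with a chosen input alphabet
    # and an output rule like this do the states acquire an IP reading.
    return int(x[0] > threshold)

state = run("abba")
print(state, observe(state))

The point the abstract makes is visible here: the dictionary of maps alone is just dynamics; whatever the network can be said to compute depends also on which input sequences are allowed and on how observe reads the state.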
Similar papers
Recurrent Neural Networks with Iterated Function Systems Dynamics
We propose a recurrent neural network (RNN) model whose recurrent part corresponds to iterated function systems (IFS), introduced by Barnsley [1] as a fractal image-compression mechanism. The key idea is that 1) in our model we avoid learning the RNN state part by having non-trainable connections between the context and recurrent layers (this makes the training process less problematic and f...
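A rough sketch of this idea under stated assumptions (fixed, non-trainable recurrent weights and a trainable linear readout; the exact architecture, activation, and training rule of the cited model are not reproduced here, and every name below is hypothetical):

import numpy as np

# Sketch: recurrent weights are fixed at small gain so the state update is
# contractive, IFS-like; only the linear readout is fit, here by ridge
# regression on the collected states.
rng = np.random.default_rng(1)
IN_DIM, STATE_DIM = 3, 50

W_rec = 0.5 * rng.standard_normal((STATE_DIM, STATE_DIM))   # non-trainable
W_in = rng.standard_normal((STATE_DIM, IN_DIM))             # non-trainable

def collect_states(inputs):
    # Run the fixed recurrent part over an input sequence.
    x = np.zeros(STATE_DIM)
    states = []
    for u in inputs:
        x = np.tanh(W_rec @ x + W_in @ u)
        states.append(x)
    return np.array(states)

def train_readout(states, targets, ridge=1e-3):
    # Fit only the output weights (regularized least squares).
    A = states.T @ states + ridge * np.eye(STATE_DIM)
    return np.linalg.solve(A, states.T @ targets)

inputs = rng.standard_normal((200, IN_DIM))
targets = rng.standard_normal(200)            # toy target sequence
w_out = train_readout(collect_states(inputs), targets)
print("readout shape:", w_out.shape)

Because the state part is never trained, learning reduces to a convex fit of the readout, which is one way to read the claim that training becomes less problematic.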
Logic programs, iterated function systems, and recurrent radial basis function networks
Graphs of the single-step operator for first-order logic programs — displayed in the real plane — exhibit self-similar structures known from topological dynamics, i.e. they appear to be fractals, or more precisely, attractors of iterated function systems. We show that this observation can be made mathematically precise. In particular, we give conditions which ensure that those graphs coincide w...
Rotation number and its properties for iterated function and non-autonomous systems
The main purpose of this paper is to introduce the rotation number for non-autonomous and iterated function systems. First, we define iterated function systems and the lift of these types of systems on the unit circle. Next, we define the rotation number and investigate the conditions for the existence and uniqueness of this number for our systems. Then, the notions of rotational entropy an...
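For orientation, the classical rotation number of a single circle map, which the paper presumably generalizes (its exact definition for iterated function and non-autonomous systems is not shown in this truncated abstract):

% Classical case: F is a lift of an orientation-preserving circle
% homeomorphism f, i.e. \pi \circ F = f \circ \pi with \pi(x) = e^{2\pi i x}.
\[
  \rho(F) \;=\; \lim_{n \to \infty} \frac{F^{n}(x) - x}{n} \pmod 1 ,
\]
% the limit exists and does not depend on the starting point x.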
Longest Path in Networks of Queues in the Steady-State
Due to the importance of longest-path analysis in networks of queues, we develop an analytical method for computing the steady-state distribution function of the longest path in acyclic networks of queues. We assume the network consists of a number of queuing systems, each of which has either one or infinitely many servers. The distribution function of service time is assumed to be exponential or Erlang. Fu...
A Gradient Descent Method for a Neural Fractal Memory
It has been demonstrated that higher-order recurrent neural networks exhibit an underlying fractal attractor as an artifact of their dynamics. These fractal attractors offer a very efficient mechanism to encode visual memories in a neural substrate, since even a simple twelve-weight network can encode a very large set of different images. The main problem in this memory model, which so far has remai...
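To see why so few weights can encode an image, here is a small, self-contained illustration (not the cited memory model; the maps and point counts below are arbitrary choices): an attractor with visible structure is generated from just 3 x 6 = 18 affine parameters by the usual chaos-game iteration.

import numpy as np

# Each row holds (a, b, c, d, e, f) for the affine map
# (x, y) -> (a*x + b*y + e, c*x + d*y + f).
# These particular values produce a Sierpinski-style triangle.
params = np.array([
    [0.5, 0.0, 0.0, 0.5, 0.00, 0.0],
    [0.5, 0.0, 0.0, 0.5, 0.50, 0.0],
    [0.5, 0.0, 0.0, 0.5, 0.25, 0.5],
])
rng = np.random.default_rng(2)

def chaos_game(params, n_points=20000):
    # Sample the IFS attractor by repeatedly applying a randomly chosen map.
    pt = np.zeros(2)
    pts = np.empty((n_points, 2))
    for i in range(n_points):
        a, b, c, d, e, f = params[rng.integers(len(params))]
        pt = np.array([a * pt[0] + b * pt[1] + e,
                       c * pt[0] + d * pt[1] + f])
        pts[i] = pt
    return pts

points = chaos_game(params)
print(points.min(axis=0), points.max(axis=0))   # attractor lies in the unit square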
Publication date: 1994